Method: rich.syntax.Syntax.highlight.<locals>.line_tokenize
Calls: 24, Exceptions: 24, Paths: 1
Path 1: 24 calls (1.0)
Values: tuple (19491), None (24)
Exceptions: GeneratorExit (24)
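One GeneratorExit per call suggests the generator is closed before exhaustion on every invocation, likely because the caller stops consuming tokens early (for example, when only a line range is rendered). A minimal sketch of that mechanism, using a hypothetical generator unrelated to the Rich source:

from typing import Iterator

def tokens() -> Iterator[str]:
    try:
        yield "first"
        yield "second"
    except GeneratorExit:
        # Raised at the suspended yield when the generator is closed early.
        print("GeneratorExit observed")
        raise  # a generator must exit, not yield, after GeneratorExit

gen = tokens()
next(gen)    # consume "first"; the generator suspends at that yield
gen.close()  # throws GeneratorExit into the generator; prints the message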
def line_tokenize() -> Iterable[Tuple[Any, str]]:
    """Split tokens to one per line."""
    assert lexer  # required to make MyPy happy - we know lexer is not None at this point

    for token_type, token in lexer.get_tokens(code):
        while token:
            line_token, new_line, token = token.partition("\n")
            yield token_type, line_token + new_line
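The loop relies on str.partition: each token value is cut at every newline so no yielded token spans more than one line, and the newline stays attached to the preceding fragment. A self-contained sketch of the same split, with a made-up token stream standing in for lexer.get_tokens(code):

from typing import Any, Iterable, Tuple

def split_tokens(tokens: Iterable[Tuple[Any, str]]) -> Iterable[Tuple[Any, str]]:
    """Split each (type, value) token at newlines, one fragment per line."""
    for token_type, token in tokens:
        while token:
            # partition("\n") returns (before, "\n", after), or (token, "", "")
            # when no newline remains, which empties token and ends the loop.
            line_token, new_line, token = token.partition("\n")
            yield token_type, line_token + new_line

print(list(split_tokens([("Text", "a\nb\n"), ("Text", "c")])))
# [('Text', 'a\n'), ('Text', 'b\n'), ('Text', 'c')]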